    Methods of Uncertainty Quantification for Physical Parameters

    Uncertainty Quantification (UQ) is an umbrella term for a broad class of methods that typically combine computational modeling, experimental data, and expert knowledge to study a physical system. A parameter, in the usual statistical sense, is said to be physical if it has a meaningful interpretation with respect to the physical system. Physical parameters can be viewed as inherent properties of a physical process and have a corresponding true value. Statistical inference for physical parameters is a challenging problem in UQ due to the inadequacy of the computer model. In this thesis, we provide a comprehensive overview of the existing relevant UQ methodology. The computer model is often time-consuming, proprietary, or classified, and therefore a cheap-to-evaluate emulator is needed. When the input space is large, Gaussian process (GP) emulation may be infeasible, and the predominant local approximate GP (LA-GP) framework is too slow for prediction when MCMC is used for posterior sampling. We propose two modifications to the LA-GP framework which can be used to construct a cheap-to-evaluate emulator for the computer model, offering the user a simple and flexible time-for-memory exchange. When the field data consist of measurements across a set of experiments, it is common for a subset of the computer model inputs to represent measurements of a physical component, recorded with error. When this structure is present, we propose a new metric for identifying overfitting and a related regularization prior distribution. We show that these tools lead to improved inference for the compressibility parameters of tantalum. We propose an approximate Bayesian framework, referred to as modularization, which is shown to be useful for exploring dependencies between physical and nuisance parameters with respect to the inadequacy of the computer model and the available prior information. Finally, we discuss a cross-validation (CV) framework, modified to account for spatial (or temporal) structure, and show that it can aid in the construction of empirical Bayes priors for the model discrepancy. This CV framework can be coupled with modularization to assess the sensitivity of physical parameters to discrepancy-related modeling choices.
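    A minimal sketch of the local approximate GP idea that the proposed modifications build on: each prediction fits a small GP to only the nearest design points, trading global accuracy for speed. The kernel, neighborhood size, and toy simulator below are illustrative assumptions; the thesis's time-for-memory variants are not reproduced here.

```python
# Local approximate GP emulation, sketched with a squared-exponential kernel.
import numpy as np

def sq_exp_kernel(A, B, lengthscale=0.5, variance=1.0):
    """Squared-exponential covariance between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def local_gp_predict(x_star, X, y, k=30, nugget=1e-6):
    """Predict f(x_star) with a GP fit only to the k nearest design points."""
    idx = np.argsort(((X - x_star) ** 2).sum(1))[:k]   # nearest neighbors
    Xn, yn = X[idx], y[idx]
    K = sq_exp_kernel(Xn, Xn) + nugget * np.eye(k)
    k_star = sq_exp_kernel(x_star[None, :], Xn).ravel()
    mean = k_star @ np.linalg.solve(K, yn)
    var = sq_exp_kernel(x_star[None, :], x_star[None, :])[0, 0] \
          - k_star @ np.linalg.solve(K, k_star)
    return mean, var

# Toy demonstration on a cheap stand-in for an expensive simulator.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(2000, 3))
y = np.sin(4 * X[:, 0]) + X[:, 1] ** 2          # "computer model" output
m, v = local_gp_predict(np.array([0.5, 0.5, 0.5]), X, y)
```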

    Estimation of

    Bayesian model calibration has become a powerful tool for the analysis of experimental data coupled with a physics-based mathematical model. The forward problem of prediction, especially within the range of the data, is generally well-posed. There are many well-known issues with the approach when solving the inverse problem of parameter estimation, especially when the calibration parameters have physical interpretations. In this poster, we explore several techniques to identify and overcome these challenges. First, we consider regularization, which refers to the process of constraining the solution space in a meaningful and reasonable way. This is accomplished via the Moment Penalization prior distribution and the associated probability of prior coherency. Second, we consider a pseudo-Bayesian approach which we refer to as modularization. By focusing on a small number of parameters which are considered to be of interest, and forfeiting the ability to learn about the remaining parameters, robust inferential procedures can sometimes be obtained. These ideas are illustrated using several simple examples and a dynamic material property application in which material properties of tantalum are estimated.
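    A minimal sketch of the modularization idea described above, assuming a toy location model: the nuisance parameter (here, a noise scale) is estimated in its own module and plugged in, and MCMC then targets only the parameter of interest. The model, priors, and plug-in rule are illustrative assumptions, not the poster's tantalum application.

```python
# Modularized (cut-feedback) inference: estimate the nuisance parameter
# separately, then sample only the parameter of interest.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
theta_true, sigma_true = 2.0, 0.3
y = theta_true + sigma_true * rng.standard_normal(50)   # toy field data

# Module 1: plug-in estimate of the nuisance parameter (noise scale).
sigma_hat = y.std(ddof=1)

# Module 2: Metropolis over theta only, with sigma held at sigma_hat.
def log_post(theta):
    return stats.norm.logpdf(y, theta, sigma_hat).sum() \
           + stats.norm.logpdf(theta, 0.0, 10.0)        # vague prior

theta, chain = 0.0, []
for _ in range(5000):
    prop = theta + 0.1 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
print(np.mean(chain[1000:]))   # posterior mean of the parameter of interest
```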

    Generalized Bayesian MARS: Tools for Emulating Stochastic Computer Models

    The multivariate adaptive regression spline (MARS) approach of Friedman (1991) and its Bayesian counterpart (Francom et al., 2018) are effective approaches for the emulation of computer models. The traditional assumption of Gaussian errors limits the usefulness of MARS, and of many popular alternatives, when dealing with stochastic computer models. We propose a generalized Bayesian MARS (GBMARS) framework which admits the broad class of generalized hyperbolic distributions as the induced likelihood function. This allows us to develop tools for the emulation of stochastic simulators which are parsimonious, scalable, and interpretable, require minimal tuning, and provide powerful predictive and uncertainty quantification capabilities. GBMARS is capable of robust regression with t distributions, quantile regression with asymmetric Laplace distributions, and a general form of "Normal-Wald" regression in which the shape of the error distribution and the structure of the mean function are learned simultaneously. We demonstrate the effectiveness of GBMARS on various stochastic computer models and show that it compares favorably to several popular alternatives.
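    A minimal sketch of the asymmetric-Laplace connection mentioned above: maximizing an asymmetric Laplace likelihood at quantile level tau is equivalent to minimizing the pinball (check) loss. Fixed hinge-function basis elements stand in for the adaptively chosen MARS basis; the knots, data, and optimizer are illustrative assumptions, not the GBMARS sampler.

```python
# Quantile regression via the pinball loss, with a MARS-style hinge basis.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 300)
y = np.sin(2 * np.pi * x) + 0.2 * (1 + x) * rng.standard_normal(300)

def hinge_basis(x, knots=(0.25, 0.5, 0.75)):
    """Basis: intercept, x, and hinge functions max(0, x - k)."""
    cols = [np.ones_like(x), x] + [np.maximum(0.0, x - k) for k in knots]
    return np.column_stack(cols)

def pinball_loss(beta, B, y, tau):
    r = y - B @ beta
    return np.mean(np.where(r >= 0, tau * r, (tau - 1) * r))

B = hinge_basis(x)
beta0 = np.zeros(B.shape[1])
fit90 = minimize(pinball_loss, beta0, args=(B, y, 0.9), method="Nelder-Mead")
print(fit90.x)   # coefficients of the estimated 0.9-quantile curve
```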

    Discovering Active Subspaces for High-Dimensional Computer Models

    Dimension reduction techniques have long been an important topic in statistics, and active subspaces (AS) have received much attention in the computer experiments literature over the past decade. The most common approach to estimating the AS is Monte Carlo with numerical gradient evaluation, which is sensible in some settings but has obvious drawbacks. Recent research has demonstrated that active subspace calculations can be obtained in closed form, conditional on a Gaussian process (GP) surrogate; however, GPs can be limiting in high-dimensional settings for computational reasons. In this paper, we produce the relevant calculations for the more general case in which the model of interest is a linear combination of tensor products. These general equations can be applied to a GP, recovering previous results as a special case, or to models constructed by other regression techniques, including multivariate adaptive regression splines (MARS). Using a MARS surrogate has many advantages, including improved scaling, better estimation of active subspaces in high dimensions, and the ability to handle a large number of prior distributions in closed form. In one real-world example, we obtain the active subspace of a radiation-transport code with 240 inputs and 9,372 model runs in under half an hour.
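    A minimal sketch of the baseline approach the paper contrasts with: Monte Carlo estimation of the active subspace using finite-difference gradients, followed by an eigendecomposition of the averaged outer product of gradients. The test function and sample sizes are illustrative assumptions.

```python
# Monte Carlo active subspace estimation with numerical gradients.
import numpy as np

def f(x):
    """Toy model: varies mostly along a single hidden direction."""
    return np.exp(0.7 * x[0] + 0.3 * x[1])

def num_grad(f, x, h=1e-5):
    """Central finite-difference gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

rng = np.random.default_rng(3)
d, n = 10, 500
X = rng.uniform(-1, 1, size=(n, d))
grads = np.array([num_grad(f, x) for x in X])

# C = E[grad f grad f^T]; its leading eigenvectors span the active subspace.
C = grads.T @ grads / n
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
print(eigvals[order][:3])        # sharp drop after the first eigenvalue
print(eigvecs[:, order[0]][:3])  # proportional (up to sign) to (0.7, 0.3, 0, ...)
```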

    Mucin and Agitation Shape Predation of Escherichia coli by Lytic Coliphage

    The ability of bacteriophage (phage), abundant within the gastrointestinal microbiome, to regulate bacterial populations within the same micro-environment offers prophylactic and therapeutic opportunities. Bacteria and phage have both been shown to interact intimately with mucin, and these interactions invariably affect the outcomes of phage predation within the intestine. To better understand the influence of the gastrointestinal micro-environment on phage predation, we employed enclosed, in vitro systems to investigate the roles of mucin concentration and agitation, as a function of phage type and number, on bacterial killing. Using two lytic coliphages, T4 and PhiX174, bacterial viability was quantified following exposure to phages at different multiplicities of infection (MOI) within increasing, physiological levels of mucin (0–4%), with and without agitation. Comparison of bacterial viability outcomes demonstrated that at low MOI, agitation in combination with higher mucin concentration (>2%) inhibited predation by both phages. However, when the MOI was increased, PhiX174 predation was recovered regardless of mucin concentration or agitation. In contrast, only constant agitation of samples containing a high MOI of T4 resulted in phage predation; briefly agitated samples remained hindered. Our results demonstrate that each phage–bacteria pairing is uniquely influenced by environmental factors, and these factors should be considered when determining the potential efficacy of phage predation under homeostatic or therapeutic circumstances.
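    For readers unfamiliar with the term, multiplicity of infection is the ratio of phage particles to bacterial cells at inoculation; a small worked example with made-up titers:

```python
# MOI = phage added (PFU) / bacteria present (CFU); titers are illustrative.
phage_pfu = 5e8        # plaque-forming units added
bacteria_cfu = 5e7     # colony-forming units present
moi = phage_pfu / bacteria_cfu
print(f"MOI = {moi:.0f}")   # MOI = 10: ten phage per bacterium on average
```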